
    Operational and Technical Updates to the Object Reentry Survival Analysis Tool

    The Object Reentry Survival Analysis Tool (ORSAT) has been used in the NASA Orbital Debris Program Office for over 25 years to estimate the risk due to uncontrolled reentry of spacecraft and rocket bodies. Development over the last 3 years has included: a major change to the treatment of carbon fiber- and glass fiber-reinforced plastics (CFRP and GFRP, respectively); an updated atmospheric model; a new model for computing the casualty area around an impacting debris object; and a newly implemented scheme to determine the breakup altitude of a reentering object. Software was also written to automatically perform parameter sweeps in ORSAT, allowing uncertainty quantification and sensitivity analysis for components with borderline demisability. These updates have improved the speed and fidelity of reentry analyses performed with ORSAT, and have improved engineering understanding by estimating the uncertainty of each component's survivability. A statistical model for initial conditions captures the latitude bias in population density, a large improvement over the previous inclination-based, latitude-averaged models. A sample spacecraft has been analyzed with standard techniques using ORSAT 6.2.1 and again using all the updated models; we demonstrate the variation in the total debris casualty area and the overall expectation of casualty.
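    The automated parameter sweep for borderline-demisability components can be sketched as follows. This is a minimal illustration only: `survives` is a hypothetical stand-in for a single ORSAT demise run, and the parameter names and threshold are invented for the example, not ORSAT's actual interface.

```python
import itertools

def survives(thickness_m, emissivity):
    # Hypothetical stand-in for one ORSAT demise run: this toy
    # criterion (thicker, less emissive parts survive) is NOT the
    # real aerothermal model.
    return thickness_m * (1.0 - 0.5 * emissivity) > 0.002

def sweep(thicknesses, emissivities):
    """Full-factorial parameter sweep; returns the surviving fraction,
    a crude uncertainty estimate for a borderline component."""
    cases = list(itertools.product(thicknesses, emissivities))
    hits = sum(survives(t, e) for t, e in cases)
    return hits / len(cases)

surviving_fraction = sweep([0.001, 0.003, 0.005], [0.1, 0.5, 0.9])
```

    Sweeping both parameters at once also exposes which input drives survivability, i.e. the sensitivity-analysis side of the update.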

    Delensing Gravitational Wave Standard Sirens with Shear and Flexion Maps

    Supermassive black hole binary systems (SMBHBs) are standard sirens -- the gravitational wave analogue of standard candles -- and if discovered by gravitational wave detectors, they could be used as precise distance indicators. Unfortunately, gravitational lensing will randomly magnify SMBHB signals, seriously degrading any distance measurement. Using a weak lensing map of the SMBHB line of sight, we can estimate its magnification and thereby remove some of the uncertainty in its distance, a procedure we call "delensing." We find that delensing is significantly improved when galaxy shears are combined with flexion measurements, which reduce small-scale noise in reconstructed magnification maps. Under a Gaussian approximation, we estimate that delensing with a 2D mosaic image from an Extremely Large Telescope (ELT) could reduce distance errors by about 30-40% for an SMBHB at z=2. Including an additional wide shear map from a space survey telescope could reduce distance errors by 50%. Such improvement would make SMBHBs considerably more valuable as cosmological distance probes or as a fully independent check on existing probes. Comment: 9 pages, 4 figures, submitted to MNRA
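    The delensing correction itself is simple once a magnification estimate exists: a magnification mu brightens the source and biases the inferred luminosity distance low by a factor 1/sqrt(mu), so the estimated mu is divided back out. A minimal sketch (the function name is ours):

```python
import math

def delensed_distance(d_apparent, mu_est):
    """Correct an apparent (lensed) luminosity distance using an
    estimated magnification mu_est. Flux scales as mu / D**2, so the
    true distance is the apparent one times sqrt(mu_est)."""
    return d_apparent * math.sqrt(mu_est)
```

    The residual distance error then comes from the error on mu_est, which is what combining shear with flexion maps shrinks.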

    Biological evidence supports an early and complex emergence of the Isthmus of Panama.

    The linking of North and South America by the Isthmus of Panama had major impacts on global climate, oceanic and atmospheric currents, and biodiversity, yet the timing of this critical event remains contentious. The Isthmus is traditionally understood to have fully closed by ca. 3.5 million years ago (Ma), and this date has been used as a benchmark for oceanographic, climatic, and evolutionary research, but recent evidence suggests a more complex geological formation. Here, we analyze both molecular and fossil data to evaluate the tempo of biotic exchange across the Americas in light of geological evidence. We demonstrate significant waves of dispersal of terrestrial organisms at ca. 20 and 6 Ma and corresponding events separating marine organisms in the Atlantic and Pacific oceans at ca. 23 and 7 Ma. The directions and rates of dispersal were symmetrical until the last ca. 6 Ma, when northern migration of South American lineages increased significantly. Variability among taxa in their timing of dispersal or vicariance across the Isthmus is not explained by the ecological factors tested in these analyses, including biome type, dispersal ability, and elevation preference. Migration was therefore not generally regulated by intrinsic traits but more likely reflects the presence of emergent terrain several million years earlier than commonly assumed. These results indicate that the dramatic biotic turnover associated with the Great American Biotic Interchange was a long and complex process that began as early as the Oligocene-Miocene transition.

    Cosmological constraints from COMBO-17 using 3D weak lensing

    We present the first application of the 3D cosmic shear method developed in Heavens et al. (2006) and the geometric shear-ratio analysis developed in Taylor et al. (2006) to the COMBO-17 data set. 3D cosmic shear has been used to analyse galaxies with redshift estimates from two random COMBO-17 fields covering 0.52 square degrees in total, providing a conditional constraint in the (sigma_8, Omega_m) plane as well as a conditional constraint on the equation of state of dark energy, parameterised by a constant w = p/(rho c^2). The (sigma_8, Omega_m) plane analysis constrained the relation between sigma_8 and Omega_m to be sigma_8(Omega_m/0.3)^{0.57 +- 0.19} = 1.06 +0.17 -0.16, in agreement with a 2D cosmic shear analysis of COMBO-17. The 3D cosmic shear conditional constraint on w using the two random fields is w = -1.27 +0.64 -0.70. The geometric shear-ratio analysis has been applied to the A901/2 field, which contains three small galaxy clusters. Combining the analysis of the A901/2 field, using the geometric shear-ratio analysis, with that of the two random fields, using 3D cosmic shear, w is conditionally constrained to w = -1.08 +0.63 -0.58. The errors presented in this paper are shown to agree with Fisher matrix predictions made in Heavens et al. (2006) and Taylor et al. (2006). When these methods are applied to large datasets, as expected soon from surveys such as Pan-STARRS and VST-KIDS, the dark energy equation of state could be constrained to an unprecedented degree of accuracy. Comment: 10 pages, 4 figures. Accepted to MNRA
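    The quoted degeneracy line can be turned into a quick numerical check. Solving the best-fit relation sigma_8 (Omega_m/0.3)^0.57 = 1.06 for sigma_8 (central values only; the quoted errors are ignored in this sketch):

```python
def sigma8_on_degeneracy_line(omega_m, amp=1.06, slope=0.57):
    """sigma_8 implied by the best-fit relation
    sigma_8 * (omega_m / 0.3)**slope = amp, using central values."""
    return amp * (0.3 / omega_m) ** slope
```

    At Omega_m = 0.3 this returns the quoted sigma_8 = 1.06; lower Omega_m forces higher sigma_8, the usual lensing banana-shaped degeneracy.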

    The North Atlantic subpolar gyre in four high resolution models

    The authors present the first quantitative comparison between new velocity datasets and high-resolution models in the North Atlantic subpolar gyre [the 1/10° Parallel Ocean Program model (POPNA10), the Miami Isopycnic Coordinate Ocean Model (MICOM), the 1/6° Atlantic model (ATL6), and the Family of Linked Atlantic Ocean Model Experiments (FLAME)]. At the surface, the model velocities agree generally well with World Ocean Circulation Experiment (WOCE) drifter data. Two noticeable exceptions are the weakness of the East Greenland coastal current in the models and the presence in the surface layers of a strong southwestward East Reykjanes Ridge Current. At depth, the most prominent feature of the circulation is the boundary current following the continental slope. In this narrow flow, it is found that gridded float datasets cannot be used for a quantitative comparison with models. The models have very different patterns of deep convection, and it is suggested that this could be related to the differences in their barotropic transport at Cape Farewell. The models show a large drift in water mass properties, with a salinization of the Labrador Sea Water. The authors believe that the main cause is related to horizontal transports of salt, because models with different forcing and vertical mixing share the same salinization problem. A remarkable feature of the model solutions is the large westward transport over the Reykjanes Ridge [10 Sv (Sv ≡ 10^6 m^3 s^-1) or more].

    Philosophy and Science in Leibniz

    This paper explores the question of Leibniz’s contribution to the rise of modern ‘science’. To be sure, it is now generally agreed that the modern category of ‘science’ did not exist in the early modern period. At the same time, this period witnessed a very important stage in the process from which modern science eventually emerged. My discussion will be aimed at uncovering the new enterprise, and the new distinctions, which were taking shape in the early modern period under the banner of the old Aristotelian terminology. I will argue that Leibniz begins to theorize a distinction between physics and metaphysics that tracks our distinction between the autonomous enterprise of science in its modern meaning and the enterprise of philosophy. I will try to show that, for Leibniz, physics proper is the study of natural phenomena in mathematical and mechanical terms, without recourse for its explanations to metaphysical notions. This autonomy, however, does not imply for Leibniz that physics can say on its own all that there is to be said about the natural world. Quite the opposite. Leibniz inherits from the Aristotelian tradition the view that physics needs metaphysical roots or a metaphysical grounding. For Leibniz, what is ultimately real is reached by metaphysics, not by physics. This is, in my view, Leibniz’s chief insight: the new mathematical physics is an autonomous enterprise which offers its own kind of explanations but does not exhaust what can (and should) be said about the natural world.

    Perfect state distinguishability and computational speedups with postselected closed timelike curves

    Bennett and Schumacher's postselected quantum teleportation is a model of closed timelike curves (CTCs) that leads to results physically different from Deutsch's model. We show that even a single qubit passing through a postselected CTC (P-CTC) is sufficient to do any postselected quantum measurement, and we discuss an important difference between "Deutschian" CTCs (D-CTCs) and P-CTCs, in which the future existence of a P-CTC might affect the present outcome of an experiment. Then, based on a suggestion of Bennett and Smith, we explicitly show how a party assisted by P-CTCs can distinguish a set of linearly independent quantum states, and we prove that it is not possible for such a party to distinguish a set of linearly dependent states. The power of P-CTCs is thus weaker than that of D-CTCs, because the Holevo bound still applies to circuits using them, regardless of their ability to conspire in violating the uncertainty principle. We then discuss how different notions of a quantum mixture that are indistinguishable in linear quantum mechanics lead to dramatically differing conclusions in a nonlinear quantum mechanics involving P-CTCs. Finally, we give explicit circuit constructions that can efficiently factor integers, efficiently solve any decision problem in the intersection of NP and coNP, and probabilistically solve any decision problem in NP. These circuits accomplish these tasks with just one qubit traveling back in time, and they exploit the ability of postselected closed timelike curves to create grandfather paradoxes for invalid answers. Comment: 15 pages, 4 figures; Foundations of Physics (2011
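    In one common formulation of the P-CTC prescription (the postselected-teleportation model discussed above), a single CTC qubit interacting with the chronology-respecting system via a unitary U induces the nonlinear map psi -> Tr_CTC(U) psi, renormalised. A small numpy sketch of that rule (function and variable names are ours):

```python
import numpy as np

def pctc_evolve(U, psi_in, dim_ctc=2):
    """Effective (nonlinear) evolution induced by a postselected CTC:
    psi_out is proportional to Tr_CTC(U) @ psi_in, renormalised.
    U acts on the (chronology-respecting (x) CTC) tensor ordering."""
    d_cr = U.shape[0] // dim_ctc
    U4 = U.reshape(d_cr, dim_ctc, d_cr, dim_ctc)
    C = np.einsum('ikjk->ij', U4)  # partial trace over the CTC qubit
    out = C @ psi_in
    norm = np.linalg.norm(out)
    if norm == 0:
        raise ValueError("postselection succeeds with probability zero")
    return out / norm
```

    For example, with U = CNOT (system controls, CTC qubit is the target), Tr_CTC(U) is proportional to the projector onto |0>: the P-CTC filters out the amplitude that would create a paradox, which is the mechanism the factoring and NP circuits exploit for invalid answers.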

    Cosmic Shear Analysis with CFHTLS Deep data

    We present the first cosmic shear measurements obtained from the T0001 release of the Canada-France-Hawaii Telescope Legacy Survey. The data set covers three uncorrelated patches (D1, D3 and D4) of one square degree each, observed in the u*, g', r', i' and z' bands, out to i'=25.5. The depth and the multicolor observations done in the deep fields enable several data quality controls. The lensing signal is detected in both the r' and i' bands and shows similar amplitude and slope in both filters. B-modes are found to be statistically zero at all scales. Using multi-color information, we derived a photometric redshift for each galaxy and separated the sample into medium- and high-z galaxies. A stronger shear signal is detected from the high-z subsample than from the low-z subsample, as expected from weak lensing tomography. While further work is needed to model the effects of errors in the photometric redshifts, this result suggests that it will be possible to obtain constraints on the growth of dark matter fluctuations with wide-field lensing surveys. The various quality tests and analyses discussed in this work demonstrate that the MegaPrime/Megacam instrument produces excellent quality data. The combined Deep and Wide surveys give sigma_8 = 0.89 +- 0.06 assuming the Peacock & Dodds non-linear scheme and sigma_8 = 0.86 +- 0.05 for the halo fitting model, with Omega_m=0.3. We assumed a Cold Dark Matter model with flat geometry. Systematics, Hubble constant and redshift uncertainties have been marginalized over. Using only data from the Deep survey, the 1 sigma upper bound for w_0, the constant equation of state parameter, is w_0 < -0.8. Comment: 14 pages, 16 figures, accepted A&

    Constraining warm dark matter with cosmic shear power spectra

    We investigate potential constraints from cosmic shear on the dark matter particle mass, assuming all dark matter is made up of light thermal relic particles. Given the theoretical uncertainties involved in making cosmological predictions in such warm dark matter scenarios, we use analytical fits to linear warm dark matter power spectra and compare (i) the halo model using a mass function evaluated from these linear power spectra and (ii) an analytical fit to the non-linear evolution of the linear power spectra. We optimistically ignore the competing effect of baryons for this work. We find approach (ii) to be conservative compared to approach (i). We evaluate cosmological constraints using these methods, marginalising over four other cosmological parameters. Using the more conservative method, we find that a Euclid-like weak lensing survey together with constraints from the Planck cosmic microwave background mission primary anisotropies could achieve a lower limit on the particle mass of 2.5 keV. Comment: 26 pages, 9 figures, minor changes to match the version accepted for publication in JCA
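    Analytical fits to linear warm dark matter power spectra are typically written as a transfer function multiplying the CDM spectrum, P_WDM(k) = T(k)^2 P_CDM(k). A sketch in the commonly used Viel et al. fitting form; the coefficients below are illustrative assumptions, not necessarily the exact fit used in the paper:

```python
def wdm_transfer(k, m_keV, omega_wdm=0.25, h=0.7):
    """Linear WDM-to-CDM transfer function in the fitting form
    T(k) = (1 + (alpha*k)**(2*nu))**(-5/nu), with k in h/Mpc.
    Coefficient values follow the Viel et al.-style fit and should
    be treated as illustrative."""
    nu = 1.12
    alpha = 0.049 * m_keV**-1.11 * (omega_wdm / 0.25)**0.11 * (h / 0.7)**1.22
    return (1.0 + (alpha * k) ** (2 * nu)) ** (-5.0 / nu)
```

    Heavier particles shrink alpha and push the suppression to larger k, which is why a 2.5 keV lower limit corresponds to a small-scale cutoff near the edge of what weak lensing probes.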

    First detection of galaxy-galaxy-galaxy lensing in RCS. A new tool for studying the matter environment of galaxy pairs

    The weak gravitational lensing effect, small coherent distortions of galaxy images by a gravitational tidal field, can be used to study the relation between the matter and galaxy distributions. In this context, weak lensing has so far only been used to consider a second-order correlation function that relates the matter density and galaxy number density as a function of separation. We implement two new, third-order correlation functions that have recently been suggested in the literature, and apply them to the Red-Sequence Cluster Survey. We demonstrate that it is possible, even with already existing data, to make significant measurements of third-order lensing correlations. We develop an optimised computer code for the correlation functions. To test its reliability, a set of tests is performed. The correlation functions are transformed to aperture statistics, which allow easy tests for remaining systematics in the data. In order to further verify the robustness of our measurement, the signal is shown to vanish when the source ellipticities are randomised. Finally, the lensing signal is compared to crude predictions based on the halo model. On angular scales between roughly 1 arcmin and 11 arcmin, a significant third-order correlation between two lens positions and one source ellipticity is found. We discuss this correlation function as a novel tool to study the average matter environment of pairs of galaxies. Correlating two source ellipticities and one lens position yields a less significant but nevertheless detectable signal on a scale of 4 arcmin. Both signals lie roughly within the range expected by theory, which supports their cosmological origin. [ABRIDGED] Comment: 15 pages, 12 figures, accepted by A&A; minor change
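    The randomised-ellipticity null test mentioned above is easy to illustrate: each complex source ellipticity is rotated by a random angle, which destroys any coherent lensing signal while preserving the noise level. A minimal sketch (variable names are ours):

```python
import numpy as np

def randomise_ellipticities(e, rng):
    """Rotate each complex ellipticity by a random phase, erasing
    any coherent (lensing-like) alignment in the catalogue."""
    phases = np.exp(2j * np.pi * rng.random(e.shape))
    return e * phases

rng = np.random.default_rng(42)
e = np.full(10000, 0.03 + 0j)            # a purely coherent toy "signal"
mean_before = abs(e.mean())              # 0.03
mean_after = abs(randomise_ellipticities(e, rng).mean())
```

    A correlation statistic recomputed from the rotated catalogue should be consistent with zero; a surviving signal would point to systematics rather than lensing.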